ARE YOU A MEMBER OF: THE INVESTIGATION TEAM


"Take balanced parts of The FBI, The CIA,  The Associated Press, Wikileaks, Reuters, Drudge, Peer-To-Peer Forensic Experts and over 300 Million Voters, add a dash of 100% legal tactics and you have "The Investigation Team".

The charity-based, non-commercial Investigation Team is open to anyone who wants to participate in 100% legal, crowd-powered law enforcement.

There is now a global network of hundreds of ad hoc P2P social mesh network organizations that you can participate in. We created the "Fake News" crisis by exposing every corporate news outlet that lies in its reporting to promote stock market rigging and payola. We got over 300 famous political figures fired for corruption, insider trading and payola scams.

YOU ARE US! JUST GET TO WORK ON INVESTIGATION PROJECTS ON THE WEB AND YOU ARE "IN"!

Join "The Other FBI" and contribute evidence files, 302-Form drafts and investigative reports to ALL law enforcement and media entities.

Join a special Team, such as The Victims Of The Energy Department Crony Payola Scam, or form a new Team around a new case topic.

Read the training manuals, 'How To...' books and white papers online simply by typing "How To Be an Investigator" into any of the top six non-Google search engines.

You can use any tactic you can imagine, as creatively as possible, as long as you do not violate the Golden Rule:

"Never Break The Law, But Always Break The Law Breakers. Hunt Them Forever. Never Forget. Never Forgive Until They Have Paid For Their Crimes!"

Pass the word to every voter you know. Get folks involved in the effort. Look into the many groups in The Network, like Citizen Sleuths, WeSearchr, Wikileaks, ICIJ, ProPublica, and thousands more...


Crowdsourcing utilizes the input of a crowd of online users to collaboratively solve problems. To advance this emerging technology, researchers at the University of Miami are developing a computing model that uses crowdsourcing to combine and optimize human efforts and machine computing elements.
 
The new model can be used to efficiently perform the complex task of face recognition, a method used in law enforcement. It is a new approach to using social networks as a formal part of the criminal investigation process, explained Brian Blake, vice provost for Academic Affairs and dean of the Graduate School at the University of Miami (UM) and principal investigator of the project.
 
“The breadth of the internet and popularity of smartphones have facilitated the onset of online crowdsourcing platforms,” Blake said. “Our project attempts to leverage the power of the crowd to solve complex problems, on demand.”
 
Preliminary findings showed an average certainty of 14.13 percent when machine computing elements (MCEs) identified individuals in pictures on their own. Combining the efforts of MCEs and human computing elements (HCEs) on the same task produced an average increase of 55 percentage points, for an overall mean precision of 69.13 percent.
 
Unlike previous work, in which machines and humans addressed distinct tasks, this project recognizes that the jobs requiring human and/or machine intervention are sometimes exactly the same. Traditionally, a criminal investigation workflow might use computers to sort pictures and information, then use people to identify the pictures of interest. In the new approach, people and machines could sort the images simultaneously, and people and machines could both identify them. The open problem is who does what, and in which order.
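To make the "who does what and in which order" question concrete, here is a minimal sketch of such a hybrid pipeline. It is an illustration under assumptions: the function names, certainty values and pipeline structure below are invented for this example and are not the project's actual interfaces.

# Minimal sketch of a hybrid human/machine identification pipeline
# (hypothetical names and values; not the published model's interfaces).
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Identification:
    image_id: str
    label: str
    certainty: float  # 0.0 to 1.0

def machine_identify(image_id: str) -> Identification:
    # Placeholder for an automated face-recognition pass (MCE).
    return Identification(image_id, label="unknown", certainty=0.14)

def human_identify(image_id: str) -> Identification:
    # Placeholder for a crowdsourced identification task (HCE).
    return Identification(image_id, label="suspect_A", certainty=0.70)

def run_pipeline(image_ids: List[str],
                 passes: List[Callable[[str], Identification]]) -> List[Identification]:
    """Apply each pass to every image and keep the most certain answer.
    Which passes run, and in what order, is the open scheduling question."""
    results = []
    for image_id in image_ids:
        best = None
        for do_pass in passes:
            candidate = do_pass(image_id)
            if best is None or candidate.certainty > best.certainty:
                best = candidate
        results.append(best)
    return results

# Both orderings are valid configurations of the same workflow:
machine_first = run_pipeline(["img001", "img002"], [machine_identify, human_identify])
human_first = run_pipeline(["img001", "img002"], [human_identify, machine_identify])

Swapping the order of the passes, or dropping one of them entirely, is exactly the configuration decision the researchers want the model to make on its own.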
 
Elasticity is the key feature of the model: it allows the system to adapt to changes in a task's workload, so machine and human computing resources can be added or removed when and where they are needed, without disrupting operations.
 
“Elastically, we would like to decide who is best for a specific task, or what concentration of people or machines could be mixed for a specific task,” Blake said. “We also think that humans could do a first pass of a task, then machines do a second pass on the same task.”
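The article does not describe how the model makes these elastic decisions, but a rough sketch of what a single allocation step might look like follows; the capacities and the policy here are assumptions for illustration only.

# Illustrative elastic allocation step (assumed policy, not the project's actual scheduler).
def allocate(pending_tasks: int, machine_capacity: int, crowd_size: int) -> dict:
    """Choose a mix of machine and human computing elements for the current
    workload, scaling whichever pool is available without stopping the pipeline."""
    machine_share = min(pending_tasks, machine_capacity)
    human_share = min(pending_tasks - machine_share, crowd_size)
    backlog = pending_tasks - machine_share - human_share
    return {
        "machine_tasks": machine_share,  # first pass handled by MCEs
        "human_tasks": human_share,      # overflow or second pass handled by HCEs
        "deferred": backlog,             # re-evaluated as resources change
    }

print(allocate(pending_tasks=500, machine_capacity=300, crowd_size=150))
# {'machine_tasks': 300, 'human_tasks': 150, 'deferred': 50}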
 
The model combines the outputs of the human and machine computing elements and discards those that do not meet certain quality requirements. It also feeds the results back to improve future decisions, allowing the model to build a knowledge base that improves accuracy over time.
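Roughly, that combination-and-feedback loop could be pictured as follows; the quality threshold and the knowledge-base structure are assumed for illustration and are not taken from the published model.

# Sketch of combining element outputs, filtering on quality, and recording
# feedback in a simple knowledge base (assumed structure).
from collections import defaultdict

QUALITY_THRESHOLD = 0.5             # assumed cutoff for usable outputs
knowledge_base = defaultdict(list)  # element_id -> history of accepted certainties

def combine(outputs):
    """outputs: list of (element_id, label, certainty) tuples from HCEs and MCEs."""
    accepted = [(e, lbl, c) for e, lbl, c in outputs if c >= QUALITY_THRESHOLD]
    for element_id, _, certainty in accepted:
        knowledge_base[element_id].append(certainty)  # feedback for future routing
    if not accepted:
        return None
    # Pick the highest-certainty label among the accepted outputs.
    return max(accepted, key=lambda item: item[2])[1]

label = combine([("mce-1", "unknown", 0.14), ("hce-7", "suspect_A", 0.69)])
print(label)  # suspect_A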
 
The preliminary results of this study were presented at the 10th IEEE International Conference on Collaborative Computing: Networking, Applications and Worksharing, where Blake gave the opening keynote address. The findings will be published by IEEE Press in a paper titled “Combining Human and Machine Computing Elements for Analysis via Crowdsourcing.”
 
The publication was led by Julian Jarrett, a Ph.D. student, and Iman Saleh, a research scientist, both working in Blake’s lab in the Department of Computer Science at UM. Other co-authors are Rohan Malcom and Sean Thorpe, from the University of Technology in Jamaica, and Tyrone Grandison, from Proficiency Labs in Ashland, Oregon.
 
In the future, the researchers would like to recreate specific past events, or stage a simulated event with a planted “bad guy,” in which hundreds of people use their cell phones to take pictures; all of the pictures and personal accounts would then be used to test the new model.